Extended LaSalle's Invariance Principle for Full-Range Cellular Neural Networks

Authors
Abstract


Similar Resources

Extended LaSalle's Invariance Principle for Full-Range Cellular Neural Networks

In several relevant applications to the solution of signal processing tasks in real time, a cellular neural network (CNN) is required to be convergent, that is, each solution should tend toward some equilibrium point. The paper develops a Lyapunov method, which is based on a generalized version of LaSalle’s invariance principle, for studying convergence and stability of the differential inclusi...
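The convergence property described above can be illustrated numerically. The sketch below is not the paper's full-range differential-inclusion model; it is a minimal simulation, under assumed values, of the standard Chua–Yang CNN dynamics x' = -x + A·sat(x) + b with a symmetric feedback template A, the setting in which LaSalle-type arguments guarantee that every trajectory tends to an equilibrium. The template, bias, and initial state are illustrative choices, not taken from the paper.

```python
import numpy as np

def sat(x):
    # Standard CNN piecewise-linear output nonlinearity, clipped to [-1, 1].
    return np.clip(x, -1.0, 1.0)

def simulate_cnn(A, bias, x0, dt=0.01, steps=20000):
    """Forward-Euler integration of the CNN state equation x' = -x + A @ sat(x) + bias."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-x + A @ sat(x) + bias)
    return x

# Symmetric feedback template (illustrative values): symmetry is the classical
# sufficient condition for convergence via Lyapunov/LaSalle arguments.
A = np.array([[1.5, 0.2],
              [0.2, 1.5]])
bias = np.array([0.1, -0.1])

x_final = simulate_cnn(A, bias, x0=[0.3, -0.2])
# A near-zero right-hand side at the final state indicates the trajectory
# has settled at an equilibrium point.
residual = -x_final + A @ sat(x_final) + bias
print(x_final, np.linalg.norm(residual))
```

With these values the trajectory saturates into the cell (x1 > 1, x2 < -1), where the dynamics reduce to a stable linear system with equilibrium x = (1.4, -1.4); the residual norm at the end of the run is essentially zero.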


Invariance principle, multifractional Gaussian processes and long-range dependence

This paper is devoted to establishing an invariance principle in which the limit process is a multifractional Gaussian process whose multifractional function takes values in (1/2, 1). Some properties of this process, such as regularity and local self-similarity, are studied. Moreover, the limit process is compared with multifractional Brownian motion.


AN EXTENDED FUZZY ARTIFICIAL NEURAL NETWORKS MODEL FOR TIME SERIES FORECASTING

Improving time series forecasting accuracy is an important yet often difficult task. Both theoretical and empirical findings have indicated that integration of several models is an effective way to improve predictive performance, especially when the models in combination are quite different. In this paper, a hybrid model of artificial neural networks and a fuzzy model is proposed for time series for...


Robust Full Bayesian Methods for Neural Networks

Arnaud Doucet, Cambridge University Engineering Department, Cambridge CB2 1PZ, England. In this paper, we propose a full Bayesian model for neural networks. This model treats the model dimension (number of neurons), model parameters, regularisation parameters and noise parameters as random variables that need to be estimated. We then propose a reversible jump Markov chain Monte Ca...


Quantifying Translation-Invariance in Convolutional Neural Networks

A fundamental problem in object recognition is the development of image representations that are invariant to common transformations such as translation, rotation, and small deformations. There are multiple hypotheses regarding the source of translation invariance in CNNs. One idea is that translation invariance is due to the increasing receptive field size of neurons in successive convolution ...



Journal

Journal title: EURASIP Journal on Advances in Signal Processing

Year: 2009

ISSN: 1687-6180

DOI: 10.1155/2009/730968